Why Work for Us
Grubhub is a place where an authentically fun culture meets innovation and teamwork. We believe in empowering people and opening doors to new opportunities. If you're looking for a place that values strong relationships and embraces diverse ideas, all while having fun together, Grubhub is the place for you!
The Data Engineer position on Grubhub's Growth team plays a pivotal role in enhancing our big data infrastructure and refining the existing pipelines that drive Grubhub's growth marketing efforts. Partnering closely with the Customer Engagement and Analytics teams, the Data Engineer will develop automated marketing campaigns and create custom audience segments for email, push, and in-app initiatives. They will also develop and maintain monitoring dashboards to ensure the effectiveness and quality of campaign delivery. As a Data Engineer, you will be expected to shape a well-informed long-term data architecture, lead structured problem-solving efforts, and stay flexible and adaptable to seize new opportunities as they arise. The team's responsibilities extend to generating automated analytical reports and metrics that are reviewed at senior levels. Candidates should have expertise in designing, creating, managing, and leveraging large datasets for business purposes.
This position offers a distinct opportunity to own and develop novel end-to-end solutions for data storage, pipelining, and visualization. The pipelines and processes built in this role will be indispensable to the business, so we are seeking an individual who can not only excel at engineering tasks but also take a keen interest in the business intent behind their work. The ideal candidate has exceptional attention to detail, strong communication skills, resourcefulness, a customer-centric approach, a team-oriented mindset, a sense of ownership, and the ability to work independently in a fast-paced environment.
What You Bring to the Table (Qualifications Required):
- Bachelor's degree in Computer Science or Information Technology
- 3+ years of relevant experience in a data engineering role
- Excellent knowledge of SQL
- 2+ years of experience with Python programming language
- Background in ETL and data processing, with familiarity in transforming data to meet business goals (e.g., launching automated campaigns and powering monitoring dashboards)
- Ability to take ownership of projects at both a technical and organizational level
- Business acumen and judgment to drive toward top-line business outcomes and impact
- Ability to source inputs, perspectives, and risks from multiple partners to arrive at a balanced insight and recommendation
- Strong written and oral communication skills to effectively partner with various functional groups
- Willingness to understand the larger business context
- Strong self-management / prioritization capabilities
- Enthusiasm for data and the desire to work with a great team!
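To make the ETL-for-campaigns qualification concrete, here is a minimal hypothetical sketch of the kind of transform this role involves: turning raw order events into a custom audience segment for an email or push campaign. All field names, thresholds, and dates are illustrative, not Grubhub's actual schema.

```python
# Hypothetical audience-segmentation transform: find "lapsed" customers
# whose most recent order is older than a cutoff, so a win-back campaign
# can target them. Field names and thresholds are illustrative only.
from datetime import date, timedelta

def build_lapsed_diner_segment(orders, today, lapse_days=30):
    """Return sorted customer IDs whose latest order predates the cutoff."""
    last_order = {}
    for order in orders:  # orders: iterable of {"customer_id", "order_date"}
        cid = order["customer_id"]
        d = order["order_date"]
        if cid not in last_order or d > last_order[cid]:
            last_order[cid] = d
    cutoff = today - timedelta(days=lapse_days)
    return sorted(cid for cid, d in last_order.items() if d < cutoff)

orders = [
    {"customer_id": "c1", "order_date": date(2024, 1, 2)},
    {"customer_id": "c2", "order_date": date(2024, 3, 1)},
    {"customer_id": "c1", "order_date": date(2024, 2, 5)},
]
# Cutoff is 2024-02-14: c1 last ordered Feb 5 (lapsed), c2 Mar 1 (active).
segment = build_lapsed_diner_segment(orders, today=date(2024, 3, 15))
print(segment)  # ['c1']
```

In production this logic would run over large datasets in SQL or Spark rather than in-memory Python, but the shape of the transform is the same.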
Got these? Even better!
- Master's degree in Computer Science or Information Technology
- Experience with AWS or another cloud service provider
- Understanding of conventional data warehousing principles like 3NF, star schemas, and dimensional modeling
- Strong enthusiasm for learning and sharing knowledge
- Thorough attention to detail and precision, with a structured and organized approach
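As a quick illustration of the dimensional-modeling concepts listed above, here is a hedged sketch of a star schema: one fact table of orders joined to a restaurant dimension, queried with a typical rollup. The tables and values are invented for illustration and use SQLite only so the example is self-contained.

```python
# Illustrative star schema (not Grubhub's actual model): a fact table of
# orders references a restaurant dimension; analysts slice fact measures
# by dimension attributes. SQLite in-memory keeps the sketch runnable.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_restaurant (
        restaurant_key INTEGER PRIMARY KEY,
        name TEXT,
        cuisine TEXT
    );
    CREATE TABLE fact_order (
        order_key INTEGER PRIMARY KEY,
        restaurant_key INTEGER REFERENCES dim_restaurant(restaurant_key),
        order_total REAL
    );
    INSERT INTO dim_restaurant VALUES (1, 'Pizza Place', 'Italian'),
                                      (2, 'Taco Stand', 'Mexican');
    INSERT INTO fact_order VALUES (10, 1, 25.0), (11, 1, 15.0), (12, 2, 9.5);
""")
# Typical dimensional query: aggregate the fact, group by a dimension attribute.
rows = conn.execute("""
    SELECT d.cuisine, SUM(f.order_total)
    FROM fact_order f JOIN dim_restaurant d USING (restaurant_key)
    GROUP BY d.cuisine ORDER BY d.cuisine
""").fetchall()
print(rows)  # [('Italian', 40.0), ('Mexican', 9.5)]
```

Keeping descriptive attributes in dimensions and additive measures in the fact table is what lets queries like this stay simple as the warehouse grows.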
The Impact You Will Make
- Working with high volumes of data and distributed systems using technologies such as Spark, Hive, AWS EMR, AWS S3, Azkaban, and Presto
- Designing and implementing robust data pipelines to efficiently process and analyze large volumes of data
- Developing scalable and maintainable data storage solutions to accommodate the growing needs of the organization
- Collaborating with analysts to understand their metric requirements and automating those metrics for launched campaigns, so teams can scale and evaluate performance
- Building monitoring and alerting systems to ensure the integrity and availability of data pipelines
- Optimizing query performance and data processing workflows to reduce latency and improve efficiency
- Providing technical expertise and guidance to other team members on best practices for data engineering and architecture
- Continuously evaluating new technologies and tools to enhance the team's capabilities and improve efficiency